Kernel Coding: General Formulation and Special Cases

Authors

  • Mehrtash Tafazzoli Harandi
  • Mathieu Salzmann
Abstract

Representing images by compact codes has proven beneficial for many visual recognition tasks. Most existing techniques, however, perform this coding step directly in image feature space, where the distributions of the different classes are typically entangled. In contrast, here, we study the problem of performing coding in a high-dimensional Hilbert space, where the classes are expected to be more easily separable. To this end, we introduce a general coding formulation that encompasses the most popular techniques, such as bag of words, sparse coding and locality-based coding, and show how this formulation and its special cases can be kernelized. Importantly, we address several aspects of learning in our general formulation, such as kernel learning, dictionary learning and supervised kernel coding. Our experimental evaluation on several visual recognition tasks demonstrates the benefits of performing coding in Hilbert space, and in particular of jointly learning the kernel, the dictionary and the classifier.
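
To make the idea concrete, below is a minimal sketch, in Python with NumPy, of coding a sample over a dictionary directly in the RKHS induced by a kernel. It uses an ℓ2 (ridge) penalty so the code has a closed form; the sparse and locality-based special cases discussed in the abstract would swap in a different penalty and need an iterative solver. The function names, the RBF kernel choice and the regularization value are illustrative assumptions, not the authors' implementation.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # k(a, b) = exp(-gamma * ||a - b||^2) for every row pair of A and B
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_code(x, D, lam=0.1, gamma=1.0):
    # Code x over the dictionary rows of D in the RKHS:
    #   minimize ||phi(x) - sum_j c_j phi(d_j)||^2 + lam * ||c||^2
    # Expanded, the objective depends on the data only through kernel values,
    # and the minimizer solves the linear system (K + lam*I) c = k.
    K = rbf_kernel(D, D, gamma)                    # Gram matrix of the atoms
    k = rbf_kernel(D, x[None, :], gamma).ravel()   # kernel between atoms and x
    return np.linalg.solve(K + lam * np.eye(D.shape[0]), k)

# Usage (illustrative): code a random 64-D descriptor over 100 random atoms.
# x = np.random.randn(64); D = np.random.randn(100, 64); c = kernel_code(x, D)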


Similar resources

Multitask Multiple Kernel Learning

We present a general regularization-based framework for Multi-task learning (MTL), in which the similarity between tasks can be learned or refined using ℓp-norm Multiple Kernel learning (MKL). Based on this very general formulation (including a general loss function), we derive the corresponding dual formulation using Fenchel duality applied to Hermitian matrices. We show that numerous establish...


A Note on Metropolis-Hastings Kernels for General State

The Metropolis-Hastings algorithm is a method of constructing a reversible Markov transition kernel with a specified invariant distribution. This note describes necessary and sufficient conditions on the candidate generation kernel and the acceptance probability function for the resulting transition kernel and invariant distribution to satisfy the detailed balance conditions. A simple general form...
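
As an illustration of the acceptance probability that enforces detailed balance, here is a hedged one-step sketch of a Metropolis-Hastings update in Python; the symmetric Gaussian random-walk proposal, the target density and the function names are assumptions for illustration, not part of the note itself.

import numpy as np

def mh_step(x, log_target, rng, step=1.0):
    # One Metropolis-Hastings update with a symmetric Gaussian random-walk
    # proposal, so the candidate kernel q cancels in the acceptance ratio:
    #   alpha(x, y) = min(1, pi(y) q(y, x) / (pi(x) q(x, y))) = min(1, pi(y) / pi(x))
    # Accepting with probability alpha makes the chain reversible w.r.t. pi.
    y = x + step * rng.standard_normal(np.shape(x))
    log_alpha = min(0.0, log_target(y) - log_target(x))
    return y if np.log(rng.uniform()) < log_alpha else x

# Usage (illustrative): sample from a standard normal target.
# rng = np.random.default_rng(0)
# x = 0.0
# for _ in range(1000): x = mh_step(x, lambda z: -0.5 * np.sum(z**2), rng)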


Framework for Multi-task Multiple Kernel Learning and Applications in Genome Analysis

We present a general regularization-based framework for Multi-task learning (MTL), in which the similarity between tasks can be learned or refined using ℓp-norm Multiple Kernel learning (MKL). Based on this very general formulation (including a general loss function), we derive the corresponding dual formulation using Fenchel duality applied to Hermitian matrices. We show that numerous establis...


Learning Tensors in Reproducing Kernel Hilbert Spaces with Multilinear Spectral Penalties

We present a general framework to learn functions in tensor product reproducing kernel Hilbert spaces (TP-RKHSs). The methodology is based on a novel representer theorem suitable for existing as well as new spectral penalties for tensors. When the functions in the TP-RKHS are defined on the Cartesian product of finite discrete sets, in particular, our main problem formulation admits as a specia...


Local structure preserving sparse coding for infrared target recognition

Sparse coding performs well in image classification. However, robust target recognition requires a large set of comprehensive template images, and the sparse learning process is complex. We incorporate sparsity into a template matching concept to construct a local sparse structure matching (LSSM) model for general infrared target recognition. A local structure preserving sparse coding (LSPSc) formulat...



Journal:
  • CoRR

Volume: abs/1409.0084  Issue: -

Pages: -

Publication date: 2014